Limiting density of discrete points

In information theory, the limiting density of discrete points is an adjustment to Claude Shannon's formula for differential entropy.
It was formulated by Edwin Thompson Jaynes to address defects in the initial definition of differential entropy.
== Definition ==
Shannon originally wrote down the following formula for the entropy of a continuous distribution, known as differential entropy:
: H(X)=-\int p(x)\log p(x)\,dx.
Unlike Shannon's formula for the discrete entropy, however, this is not the result of any derivation (Shannon simply replaced the summation symbol in the discrete version with an integral) and it turns out to lack many of the properties that make the discrete entropy a useful measure of uncertainty. In particular, it is not invariant under a change of variables and can even become negative.
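For example, the differential entropy of a uniform density on [0, a] is \log a, which is negative whenever a < 1, and rescaling the variable shifts it by the logarithm of the scale factor. Below is a minimal numerical sketch of both defects, using an illustrative uniform density on [0, 1/2] and a midpoint Riemann sum for the integral:

```python
import numpy as np

# Illustrative setup: uniform density on [0, a] with a = 1/2, discretized
# on a midpoint grid so the sum approximates the integral.
a, n = 0.5, 100_000
dx = a / n
x = (np.arange(n) + 0.5) * dx
p = np.full(n, 1.0 / a)              # p(x) = 1/a on [0, a]
H_x = -np.sum(p * np.log(p)) * dx    # H(X) = -\int p log p dx
print(H_x)                           # ~ log(1/2) = -0.693: negative

# Change of variables y = 2x: the image is uniform on [0, 2a] = [0, 1].
dy = 2 * dx
q = np.full(n, 1.0 / (2 * a))        # p_Y(y) = 1/(2a)
H_y = -np.sum(q * np.log(q)) * dy
print(H_y)                           # ~ 0.0: the "entropy" shifted by log 2
```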
Jaynes (1963, 1968) argued that the formula for the continuous entropy should be derived by taking the limit of increasingly dense discrete distributions. Suppose that we have a set of n discrete points \{x_i\}, such that in the limit n \to \infty their density approaches a function m(x) called the "invariant measure":
: \lim_{n \to \infty}\frac{1}{n}\,(\text{number of points in } a < x < b) = \int_a^b m(x)\,dx.
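As a concrete check of this limit (an illustrative construction, not taken from the article): equally spaced points x_i = (i + 1/2)/n on (0, 1) have invariant measure m(x) = 1, so the fraction of points in (a, b) should approach \int_a^b m(x)\,dx = b - a:

```python
import numpy as np

# Illustrative points x_i = (i + 1/2)/n, whose density approaches m(x) = 1.
a, b = 0.2, 0.7
for n in (10, 100, 10_000):
    pts = (np.arange(n) + 0.5) / n
    frac = np.mean((pts > a) & (pts < b))  # (1/n) * number of points in (a, b)
    print(n, frac)                         # -> b - a = 0.5
```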
In this limit, the spacing between adjacent points near x is approximately 1/(n\,m(x)), so the probability assigned to the i-th point is p_i \approx p(x_i)/(n\,m(x_i)). Substituting this into the discrete entropy -\sum_i p_i \log p_i and subtracting the divergent \log n term, Jaynes derived the following formula for the continuous entropy, which he argued should be taken as the correct one:
: H(X)=-\int p(x)\log\frac{p(x)}{m(x)}\,dx.
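The role of the subtracted \log n term can be checked numerically. The sketch below uses assumed densities p(x) = 2x and m(x) = 1 on (0, 1), chosen only for illustration, and shows the discrete entropy H_n minus \log n converging to Jaynes' integral, 1/2 - \log 2 \approx -0.193:

```python
import numpy as np

# Assumed example: points x_i = (i + 1/2)/n (so m(x) = 1) and p(x) = 2x.
for n in (100, 10_000, 1_000_000):
    x = (np.arange(n) + 0.5) / n
    w = 2 * x                          # p(x_i), up to normalization
    p_i = w / w.sum()                  # discrete probabilities
    H_n = -np.sum(p_i * np.log(p_i))   # discrete Shannon entropy
    print(n, H_n - np.log(n))          # -> 1/2 - log 2 = -0.1931...
```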
Jaynes' expression is similar to the (negative of the) Kullback–Leibler divergence, or relative entropy, which is a comparison between two probability distributions, with one difference: in the Kullback–Leibler divergence, m(x) must be a probability density, whereas in Jaynes' formula m(x) is simply a density, meaning that it does not have to integrate to 1.
Jaynes' continuous entropy formula has the property of being invariant under a change of variables, provided that m(x) and p(x) are transformed in the same way. (This motivates the moniker "invariant measure" for ''m''.) This solves many of the difficulties that come from applying Shannon's continuous entropy formula.
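A quick numerical illustration of the invariance (again with the assumed densities p(x) = 2x and m(x) = 1 on (0, 1), and an arbitrary rescaling y = 3x): transforming both densities by the same Jacobian factor leaves H unchanged, whereas Shannon's original formula would shift by \log 3:

```python
import numpy as np

# Assumed example densities on a midpoint grid over (0, 1).
n = 100_000
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx
p, m = 2 * x, np.ones(n)
H = -np.sum(p * np.log(p / m)) * dx
print(H)                               # ~ 1/2 - log 2 = -0.1931

# Rescale y = 3x: both densities pick up the same factor 1/|dy/dx| = 1/3.
dy = 3 * dx
p_y, m_y = p / 3, m / 3
H_y = -np.sum(p_y * np.log(p_y / m_y)) * dy
print(H_y)                             # same value: invariant under the change
```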

Excerpt source: the free encyclopedia Wikipedia (English edition). Read the full article "Limiting density of discrete points" on Wikipedia.